Feedforward Neural Network Initialization: an Evolutionary Approach
Authors
Abstract
The initial set of weights used in supervised learning for multilayer neural networks has a strong influence on the learning speed and on the quality of the solution obtained after convergence. An inadequate initial choice of the weight values may cause the training process to get stuck in a poor local minimum or to run into numerical problems. Several techniques have been proposed that try to avoid both local minima and numerical instability solely through a proper definition of the initial set of weights. The focus of this paper is on the application of genetic algorithms (GA) as a tool to analyze the space of weights, in order to achieve good initial conditions for supervised learning. The GA's almost-global sampling complements connectionist local search techniques well and allows us to identify some very important characteristics of the initial set of weights for multilayer networks. The results presented are compared, for a set of benchmarks, with those produced by other approaches found in the literature.
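As an illustration of the general idea rather than the exact procedure of the paper, the sketch below evolves a population of flat weight vectors for a tiny two-layer network and scores each candidate by the loss reached after a short burst of gradient descent, so that selection favors initial points from which local search converges quickly. The XOR task, the finite-difference gradient, and all hyperparameters are assumptions made for the example.

```python
# Illustrative sketch only: a GA over flat weight vectors for a tiny 2-4-1 MLP,
# where fitness is the (negated) loss reached after a short burst of gradient
# descent from that candidate.  The XOR task, the finite-difference gradient,
# and every hyperparameter below are assumptions, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT   # weights + biases, flattened

def unpack(w):
    """Split a flat parameter vector into layer matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(w):
    """Mean squared error of the network encoded by the flat vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def fitness(w, epochs=20, lr=0.5, eps=1e-4):
    """Negated loss after a few steps of (finite-difference) gradient descent:
    rewards initial points from which local search makes fast progress."""
    w = w.copy()
    for _ in range(epochs):
        grad = np.zeros_like(w)
        for k in range(N_W):
            d = np.zeros_like(w); d[k] = eps
            grad[k] = (mse(w + d) - mse(w - d)) / (2 * eps)
        w -= lr * grad
    return -mse(w)

POP_SIZE, N_GEN, MUT_STD = 30, 15, 0.3
pop = rng.normal(0.0, 1.0, size=(POP_SIZE, N_W))

for gen in range(N_GEN):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:POP_SIZE // 2]]   # truncation selection
    children = []
    while len(children) < POP_SIZE - len(parents):
        p1, p2 = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(N_W) < 0.5                          # uniform crossover
        children.append(np.where(mask, p1, p2) + rng.normal(0, MUT_STD, N_W))
    pop = np.vstack([parents, np.array(children)])

final_scores = [fitness(ind) for ind in pop]
best_init = pop[int(np.argmax(final_scores))]                 # candidate initial weights
print("best initial-weight fitness:", max(final_scores))
```

The vector best_init would then serve as the starting point for a full supervised training run, which is the role the evolved initial weights play in the approach described above.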
Similar Resources
Mapping Some Functions and Four Arithmetic Operations to Multilayer Feedforward Neural Networks
This paper continues the development of a heuristic initialization methodology for designing multilayer feedforward neural networks aimed at modeling nonlinear functions for engineering mechanics applications as presented previously at IMAC XXIV and XXV. Seeking a transparent and domain knowledge-based approach for neural network initialization and result interpretation, this study examines the...
An Initialization Method for Feedforward Artificial Neural Networks Using Polynomial Bases
We propose an initialization method for feedforward artificial neural networks (FFANNs) trained to model physical systems. A polynomial solution of the physical system is obtained using a mathematical model and then mapped into the neural network to initialize its weights. The network can next be trained with a dataset to refine its accuracy. We focus attention on an elliptical partial differen...
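The snippet below is a loose, assumption-laden stand-in for this idea: instead of the paper's analytic mapping of a polynomial solution into the weights, it simply pre-fits a small tanh network to samples of an assumed polynomial solution, and the resulting weights would serve as the initialization for training on the measured data. The polynomial, network size, and learning schedule are all invented for illustration.

```python
# Loose stand-in for polynomial-based initialization, not the cited paper's
# analytic mapping: the network is pre-fitted to samples of an assumed
# polynomial "solution" of the physical model, and the resulting weights would
# be the starting point for training on measured data.
import numpy as np

rng = np.random.default_rng(1)

# Step 1: assumed polynomial solution p(x) = 0.5 - x + 0.25 x^2 of the model.
poly_coeffs_desc = np.array([0.25, -1.0, 0.5])     # highest degree first
xs = np.linspace(-2, 2, 200).reshape(-1, 1)
ys = np.polyval(poly_coeffs_desc, xs)

# Step 2: small MLP with a tanh hidden layer and a linear output.
W1 = rng.normal(0, 0.5, (1, 10)); b1 = np.zeros(10)
W2 = rng.normal(0, 0.5, (10, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

# Step 3: gradient-descent pre-fit to the polynomial -> initialized weights.
lr = 0.05
for _ in range(3000):
    pred, h = forward(xs)
    err = (pred - ys) / len(xs)                    # d(MSE)/d(pred), up to a factor of 2
    gW2 = h.T @ err;  gb2 = err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)               # back-propagate through tanh
    gW1 = xs.T @ dh;  gb1 = dh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("pre-fit MSE vs. polynomial:", float(np.mean((forward(xs)[0] - ys) ** 2)))
# W1, b1, W2, b2 now encode the polynomial approximately and would serve as the
# initial weights for training on the actual dataset.
```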
Initialization of Recurrent Networks Using Fourier Analysis
Time-delayed recurrent neural network models preserve information through time and are more powerful than static feedforward networks, especially in dynamic problems. At present, recurrent networks are mostly formulated as nonlinear autoregression models [4] when applied to time series prediction problems. In this paper, we use a novel approach to interpret recurrent networks. We build t...
A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training
In this study, a hybrid differential evolution-back-propagation algorithm is proposed to optimize the weights of a feedforward neural network. The hybrid algorithm can achieve faster convergence with higher accuracy. The proposed algorithm, which combines differential evolution (DE) and back-propagation (BP) and is referred to as the DE-BP algorithm, is used to train the weights of the feed-forward...
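A minimal sketch of such a two-phase scheme is given below: a DE/rand/1/bin global search over a flat parameter vector, followed by plain gradient-descent (back-propagation) refinement of the best individual. The toy sin(x) regression task, the phase split, and all hyperparameters (F, CR, population size, learning rate) are illustrative assumptions, not the settings of the cited study.

```python
# Sketch of a DE-then-BP hybrid for a tiny feedforward network (assumed setup).
import numpy as np

rng = np.random.default_rng(2)

# Toy regression task: y = sin(x).
X = np.linspace(-3, 3, 60).reshape(-1, 1)
Y = np.sin(X)

N_HID = 8
DIM = 1 * N_HID + N_HID + N_HID * 1 + 1            # flat parameter vector length

def unpack(w):
    W1 = w[:N_HID].reshape(1, N_HID)
    b1 = w[N_HID:2 * N_HID]
    W2 = w[2 * N_HID:3 * N_HID].reshape(N_HID, 1)
    b2 = w[3 * N_HID:]
    return W1, b1, W2, b2

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - Y) ** 2))

# Phase 1: differential evolution (DE/rand/1/bin) for a coarse global search.
NP, F, CR = 20, 0.6, 0.9
pop = rng.normal(0, 1, (NP, DIM))
costs = np.array([loss(p) for p in pop])
for _ in range(100):
    for i in range(NP):
        a, b, c = pop[rng.choice([j for j in range(NP) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True             # ensure at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        lt = loss(trial)
        if lt < costs[i]:
            pop[i], costs[i] = trial, lt

# Phase 2: back-propagation (plain gradient descent) refines the DE winner.
w = pop[np.argmin(costs)].copy()
lr = 0.05
for _ in range(2000):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2 - Y) / len(X)
    gW2 = h.T @ err;  gb2 = err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh;   gb1 = dh.sum(0)
    w -= lr * np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

print("MSE after DE phase:", costs.min(), "after BP refinement:", loss(w))
```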
Evolving Neural Networks for Hang Seng Stock Index Forecast
This paper describes an evolutionary neural network approach to Hang Seng stock index forecast. In this approach, a feedforward neural network is evolved using an evolutionary programming algorithm. Both the weights and architectures (i.e., connectivity of the network) are evolved in the same evolutionary process. The network may grow as well as shrink. The experimental results show that the ev...
Training Feedforward Neural Networks with Standard Logistic Activations is Feasible
Training feedforward neural networks with standard logistic activations is considered difficult because of the intrinsic properties of these sigmoidal functions. This work aims at showing that these networks can be trained to achieve generalization performance comparable to that of networks based on hyperbolic tangent activations. The solution consists of applying a set of conditions on parameter initiali...
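The exact initialization conditions of that work are not reproduced here; the hedged sketch below only demonstrates the underlying phenomenon: with naively scaled initial weights the logistic units saturate and their local gradients collapse, whereas a roughly 1/sqrt(fan-in) scaling keeps pre-activations in the high-gradient region. The layer width, input statistics, and scaling rule are assumptions for the example.

```python
# Demonstration (not the cited paper's conditions): how the scale of initial
# weights controls saturation of logistic (sigmoid) units.
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

fan_in = 256
x = rng.normal(0, 1, (1000, fan_in))               # standardized inputs

def layer_stats(weight_std):
    """Return (pre-activation std, mean sigmoid derivative) for a random layer."""
    W = rng.normal(0, weight_std, (fan_in, fan_in))
    z = x @ W                                       # pre-activations
    a = sigmoid(z)
    local_grad = a * (1 - a)                        # sigmoid derivative
    return z.std(), local_grad.mean()

# Naive init: huge pre-activations, derivative collapses toward zero (saturation).
print("std = 1.0        ->", layer_stats(1.0))
# Scaled init (~1/sqrt(fan_in)): pre-activations stay small, gradients survive.
print("std = 1/sqrt(n)  ->", layer_stats(1.0 / np.sqrt(fan_in)))
```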
Publication year: 1998